Cohen's kappa
Cohen's kappa coefficient is a statistic that measures inter-rater agreement for qualitative (categorical) items. It is generally thought to be a more robust measure than a simple percent-agreement calculation, since κ takes into account the possibility of agreement occurring by chance.
==Calculation==

Cohen's kappa measures the agreement between two raters who each classify ''N'' items into ''C'' mutually exclusive categories. The first mention of a kappa-like statistic is attributed to Galton (1892);〔Galton, F. (1892). ''Finger Prints''. Macmillan, London.〕 see Smeeton (1985).
The equation for κ is:
:\kappa = \frac{p_o - p_e}{1 - p_e} = 1 - \frac{1 - p_o}{1 - p_e}, \!
where p_o is the relative observed agreement among raters, and p_e is the hypothetical probability of chance agreement, using the observed data to calculate the probabilities of each observer randomly saying each category. If the raters are in complete agreement then κ = 1. If there is no agreement among the raters other than what would be expected by chance (as given by p_e), κ = 0.
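As an illustration, p_o and p_e can be computed directly from two raters' labels. The following Python sketch follows the definitions above; the function name and the example data are purely illustrative and not taken from any particular library.
<syntaxhighlight lang="python">
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Minimal sketch of Cohen's kappa for two raters over the same N items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)

    # Relative observed agreement p_o: fraction of items both raters labelled identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Chance agreement p_e: for each category, the product of the two raters'
    # marginal proportions, summed over categories.
    count_a = Counter(rater_a)
    count_b = Counter(rater_b)
    p_e = sum((count_a[c] / n) * (count_b[c] / n) for c in categories)

    return (p_o - p_e) / (1 - p_e)

# Illustrative data: two raters classifying 10 items as "yes"/"no".
a = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes", "no", "yes"]
b = ["yes", "no",  "no", "yes", "no", "yes", "yes", "yes", "no", "no"]
print(cohens_kappa(a, b))  # p_o = 0.7, p_e = 0.5, so kappa = 0.4
</syntaxhighlight>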
The seminal paper introducing kappa as a new technique was published by Jacob Cohen in the journal ''Educational and Psychological Measurement'' in 1960.
A similar statistic, called pi, was proposed by Scott (1955). Cohen's kappa and Scott's pi differ in terms of how p_e is calculated.
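To sketch that difference under the usual formulations: Cohen's kappa computes p_e from each rater's own marginal distribution, whereas Scott's pi computes it from the marginal distribution pooled across both raters. The Python fragment below is illustrative only, and the function names are hypothetical.
<syntaxhighlight lang="python">
from collections import Counter

def chance_agreement_cohen(rater_a, rater_b):
    """p_e as in Cohen's kappa: each rater keeps their own marginal distribution."""
    n = len(rater_a)
    ca, cb = Counter(rater_a), Counter(rater_b)
    return sum((ca[c] / n) * (cb[c] / n) for c in set(rater_a) | set(rater_b))

def chance_agreement_scott(rater_a, rater_b):
    """p_e as in Scott's pi: both raters share the pooled marginal distribution."""
    n = len(rater_a)
    pooled = Counter(rater_a) + Counter(rater_b)
    return sum((pooled[c] / (2 * n)) ** 2 for c in pooled)

a = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes", "no", "yes"]
b = ["yes", "no",  "no", "yes", "no", "yes", "yes", "yes", "no", "no"]
print(chance_agreement_cohen(a, b))  # 0.50  (0.6*0.5 + 0.4*0.5)
print(chance_agreement_scott(a, b))  # 0.505 (pooled: 11 "yes", 9 "no" out of 20)
</syntaxhighlight>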
Note that Cohen's kappa measures agreement between two raters only. For a similar measure of agreement (Fleiss' kappa) used when there are more than two raters, see Fleiss (1971). The Fleiss kappa, however, is a multi-rater generalization of Scott's pi statistic, not of Cohen's kappa. Kappa is also used to compare performance in machine learning, but the directional version known as Informedness or Youden's J statistic is argued to be more appropriate for supervised learning.
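For the binary case, Informedness (Youden's J) equals sensitivity plus specificity minus one; unlike kappa, it treats one set of labels as ground truth and is therefore directional. The sketch below is illustrative, with a hypothetical function name and example data.
<syntaxhighlight lang="python">
def youdens_j(y_true, y_pred, positive="yes"):
    """Informedness / Youden's J = sensitivity + specificity - 1 (binary case)."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    tn = sum(t != positive and p != positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity + specificity - 1

# Illustrative data: treat y_true as ground truth, y_pred as a classifier's output.
y_true = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes", "no", "yes"]
y_pred = ["yes", "no",  "no", "yes", "no", "yes", "yes", "yes", "no", "no"]
print(youdens_j(y_true, y_pred))  # sensitivity 4/6, specificity 3/4 -> about 0.417
</syntaxhighlight>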

Excerpt source: Wikipedia, the free encyclopedia.
Read the full article on "Cohen's kappa" at Wikipedia.


